Data Structures and Algorithms

Asymptotic Notation

Consider the example: Given a number $n$, write a function to find the sum of the first $n$ natural numbers.

Some solutions (Python):

  1.  def sum1(n):
          # integer division keeps the result an int
          return n * (n + 1) // 2
    

    Total work done: $c_1$, and it is not dependent on $n$.

  2.  def sum2(n):
          total = 0
          for i in range(n + 1):
              total += i
          return total
    

    Total work done: some constant work plus a loop that executes $n$ times: $c_2 n + c_3$.

  3.  def sum3(n):
          total = 0
          for i in range(1, n + 1):
              for j in range(1, i + 1):
                  total += 1
          return total
    

    The inner loop executes:

    • once for i = 1
    • twice for i = 2
    • thrice for i = 3

    So the total work done is:

    $1 + 2 + 3 + \dots + n = \frac{n(n+1)}{2} = c_4 n^2 + c_5 n + c_6$
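As a quick sanity check, sum3's nested-loop count can be compared against the closed form (a minimal sketch, using sum3 as defined above):

```python
def sum3(n):
    # count one unit of work per inner-loop iteration
    total = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            total += 1
    return total

# the loop count equals n * (n + 1) / 2 for every n tested
for n in range(1, 100):
    assert sum3(n) == n * (n + 1) // 2
```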

Asymptotic Analysis

It’s a measure of the order of growth of an algorithm in terms of its input size.

Let’s compare the growth of solutions 1 and 2.

Assuming the constant costs shown in the figure (a constant $10$ for sum1() and $2n + 5$ for sum2()), let's find the $n$ at which sum1() starts to win:

$2n + 5 \geqslant 10 \therefore n \geqslant 2.5 \approx 3$

So, for any $n \geqslant 3$, the constant-time sum1() will always be faster than sum2().
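The crossover can be checked numerically (a sketch; the cost models $10$ and $2n + 5$ are the assumed constants from the analysis above):

```python
def cost_sum1(n):
    # constant cost, independent of n (assumed value: 10)
    return 10

def cost_sum2(n):
    # linear cost c2*n + c3 with assumed c2 = 2, c3 = 5
    return 2 * n + 5

# sum2 is cheaper only below the crossover; from n = 3 onward sum1 wins
assert cost_sum2(2) < cost_sum1(2)
assert all(cost_sum1(n) <= cost_sum2(n) for n in range(3, 100))
```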

Order of growth

A function $f(n)$ is said to grow faster than $g(n)$ if, for $n \geqslant 0$ with $f(n), g(n) \geqslant 0$:

$\lim\limits_{n \to \infty} \frac{g(n)}{f(n)} = 0$

Example:

$f(n) = n^2 + n + 6 \qquad g(n) = 2n + 5$

$\lim\limits_{n \to \infty} \frac{2n + 5}{n^2 + n + 6}$

Dividing numerator and denominator by the highest-order term, $n^2$:

$\lim\limits_{n \to \infty} \frac{2/n + 5/n^2}{1 + 1/n + 6/n^2}$

As $n$ tends to $\infty$, this becomes $\frac{0 + 0}{1 + 0 + 0} = 0$.

So, if $f(n)$ and $g(n)$ represent the time complexities (orders of growth) of two algorithms, the algorithm with complexity $g(n)$ is the better one.
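The limit can also be seen numerically: the ratio $g(n)/f(n)$ shrinks toward zero as $n$ grows (a sketch using the two example functions above):

```python
def f(n):
    return n**2 + n + 6

def g(n):
    return 2 * n + 5

# the ratio drops steadily toward 0, so f grows strictly faster than g
ratios = [g(n) / f(n) for n in (10, 100, 10_000, 1_000_000)]
assert ratios == sorted(ratios, reverse=True)
assert ratios[-1] < 1e-5
```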

Direct way to find out the order of growth

  1. Ignore lower order terms.
  2. Ignore leading constants.

A faster-growing function dominates a slower-growing one.

Common dominance relations: $C < \log\log n < \log n < n^{1/3} < n^{1/2} < n < n^2 < n^3 < 2^n < n^n$
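The chain can be checked empirically at a moderate $n$ (a sketch; the constant $C$ is arbitrarily taken as 1):

```python
import math

n = 100
growth = [
    1,                          # C (arbitrary constant)
    math.log(math.log(n)),      # loglog n
    math.log(n),                # log n
    n ** (1 / 3),               # n^(1/3)
    n ** 0.5,                   # n^(1/2)
    n,                          # n
    n ** 2,                     # n^2
    n ** 3,                     # n^3
    2 ** n,                     # 2^n
    n ** n,                     # n^n
]
# each function dominates every one before it in the chain
assert growth == sorted(growth)
```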

Asymptotic Notations and Best, Average and Worst Cases

Best, Average and Worst Cases

Let’s consider some examples:

  1.  def nsum(arr, n):
          total = 0
          for i in range(n):
              total += arr[i]
          return total
    

    This function is linear in every case: there is no separate best or worst case.

  2.  def nsum(arr, n):
          if n % 2 != 0:
              return 0
          total = 0
          for i in range(n):
              total += arr[i]
          return total
    

    Best Case: when $n$ is odd, the function returns in constant time.
    Average Case: often impractical to calculate unless the distribution of inputs the algorithm will receive is known.
    Worst Case: when $n$ is even, the function takes linear time.
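The cases can be exercised directly (a sketch using the second nsum above):

```python
def nsum(arr, n):
    if n % 2 != 0:       # best case: odd n, returns in constant time
        return 0
    total = 0
    for i in range(n):   # worst case: even n, loops over all n elements
        total += arr[i]
    return total

arr = list(range(10))
print(nsum(arr, 9))   # odd n: prints 0 without touching the array
print(nsum(arr, 10))  # even n: prints 45 after a full linear pass
```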

Worst Case is considered the most important case for algorithm analysis.

Cases are not related to notations: you can use any notation for any case.

Asymptotic Notations

Big O: represents an exact or upper bound on the order of growth.
Big Theta (𝛳): represents an exact bound.
Big Omega (Ω): represents an exact or lower bound.

Big O is the most common notation used.

Big O Notation

$f(n) = O(g(n))$ if and only if there exist positive constants $c$ and $n_0$ such that $f(n) \leqslant cg(n)$ for all $n \geqslant n_0$.

In simple terms, we want to find a function $g(n)$ that, when multiplied by a constant, is always equal to or greater than $f(n)$ for large values of $n$.


Example:

$f(n) = 2n + 3$ can be written as $O(n)$ after ignoring the coefficient of the highest-growing term and the lower-order terms.

Since $f(n) \leqslant c \cdot g(n)$, matching it to the definition above gives $g(n) = n$.

Let’s prove it mathematically:

$f(n) \leqslant cg(n) \ \forall\ n \geqslant n_0$
i.e. $(2n + 3) \leqslant cg(n)$
i.e. $(2n + 3) \leqslant cn \because g(n) = n$

A quick way to find a value of $c$: take the leading constant of the highest-growing term and add 1.

$\therefore c = 3$

Substituting $c$ to find the value of $n_0$:

$2n + 3 \leqslant 3n \implies 3 \leqslant n \therefore n_0 = 3$

So for all values of $n \geqslant 3$, the inequality $2n + 3 \leqslant 3n$ holds true.

If we try a value, say $n = 4$: $2(4) + 3 = 11 \leqslant 3(4) = 12$, so the inequality holds. Hence proved.
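The bound can be verified over a range of values (a sketch using the constants $c = 3$, $n_0 = 3$ derived above):

```python
def f(n):
    return 2 * n + 3

c, n0 = 3, 3  # constants witnessing f(n) = O(n)

# f(n) <= c*n holds for every n >= n0 ...
assert all(f(n) <= c * n for n in range(n0, 1000))
# ... but fails just below n0: f(2) = 7 > 3 * 2 = 6
assert f(2) > c * 2
```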

Some more examples:

  1. $\{\frac{n}{4}, 2n + 3, \frac{n}{100} + \log n, 100, \log n, \dots\} \in O(n)$
  2. $\{n^2 + n, 2n^2, \frac{n^2}{100}, \dots\} \in O(n^2)$

Since Big O is an upper bound, all the functions in list 1 can also be said to belong to $O(n^2)$ in list 2, but it helps to use tight bounds.

Big O gives the upper bound: if we say an algorithm is $O(n)$, it is going to perform linearly or better in the worst-case scenario. Therefore Big O is the upper bound of the algorithm.

Big Ω Notation

$f(n) = \Omega(g(n))$ iff there exist positive constants $c$ and $n_0$ such that $0 \leqslant cg(n) \leqslant f(n)$ for all $n \geqslant n_0$.
  • Big Omega is the exact opposite of Big O.
  • Big Omega gives the lower bound of an algorithm, i.e. the algorithm takes at least that much time for large inputs.
  • Example: $f(n) = 2n + 3 = \Omega(n)$
  • $\{\frac{n}{4}, \frac{n}{2}, 2n, 2n + 3, n^2, \dots\} \in \Omega(n)$, i.e. all functions with an order of growth greater than or equal to linear.
  • If $f(n) = \Omega(g(n))$ then $g(n) = O(f(n))$.

Big 𝛳 Notation

$f(n) = \Theta(g(n))$ iff there exist positive constants $c_1, c_2, n_0$ such that $0 \leqslant c_1 g(n) \leqslant f(n) \leqslant c_2 g(n)$ for all $n \geqslant n_0$.
  • Big Theta gives the exact bound on the order of growth of a function.
  • Example: $f(n) = 2n + 3 = \Theta(n)$
  • If $f(n) = \Theta(g(n))$ then
    $f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n))$
    $g(n) = O(f(n)) \text{ and } g(n) = \Omega(f(n))$
  • $\{\frac{n^2}{4}, \frac{n^2}{2}, n^2, 4n^2, \dots\} \in \Theta(n^2)$
  • We should use big 𝛳 notation whenever possible.
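The sandwich definition can be verified numerically for $f(n) = 2n + 3 = \Theta(n)$ (a sketch; $c_1 = 2$, $c_2 = 3$, $n_0 = 3$ are one valid choice of witnesses):

```python
def f(n):
    return 2 * n + 3

def g(n):
    return n

c1, c2, n0 = 2, 3, 3  # witnesses for f(n) = Theta(g(n))

# c1*g(n) <= f(n) <= c2*g(n) for all n >= n0
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 1000))
```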